Recursive Algorithm for L1 Norm Estimation in Linear Models

Authors

  • A. Khodabandeh
  • A. R. Amiri-Simkooei
Abstract

The L1 norm estimator has been widely used as a robust parameter estimation method for outlier detection. Several algorithms have been applied to L1 norm minimization, among which the linear programming approach based on the simplex method is the best known. In the present contribution, an interior point algorithm based on Dikin’s method is developed to solve the L1 norm minimization problem in a linear model. The method can be considered an appropriate alternative to the classical simplex method, which is sometimes time-consuming. Compared with the simplex method, the proposed method is thus easier to implement and faster in performance. Furthermore, a recursive form of Dikin’s method is derived, which resembles the recursive least-squares method. Two simulated numerical examples show that the proposed algorithm gives results as accurate as the simplex method in considerably less time. When dealing with a large number of observations, this algorithm can thus be used instead of the iteratively reweighted least-squares method and the simplex method.

DOI: 10.1061/(ASCE)SU.1943-5428.0000031

CE Database subject headings: Least squares method; Computer programming; Algorithms; Surveys; Estimation.

Author keywords: Iteratively reweighted least squares; Linear programming problem; Interior point methods; Simplex method; Affine method of Dikin; Robust estimation; L1 norm minimization; Recursive algorithm.
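The abstract contrasts the proposed interior-point scheme with iteratively reweighted least squares (IRLS). For orientation, the following is a minimal sketch of the classical IRLS baseline for L1 norm estimation in a linear model — not the paper's Dikin-based algorithm; the function name, iteration count, and tolerance are illustrative choices:

```python
import numpy as np

def l1_irls(A, y, n_iter=50, eps=1e-8):
    """Estimate x minimizing ||y - A x||_1 by iteratively reweighted
    least squares: repeatedly solve a weighted L2 problem with weights
    inversely proportional to the current absolute residuals."""
    # Start from the ordinary least-squares solution.
    x = np.linalg.lstsq(A, y, rcond=None)[0]
    for _ in range(n_iter):
        r = y - A @ x
        # Weights ~ 1/|residual|, clipped to avoid division by zero.
        w = 1.0 / np.maximum(np.abs(r), eps)
        Aw = A * w[:, None]
        # Weighted normal equations: A^T W A x = A^T W y.
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)
    return x
```

Because the L1 objective down-weights large residuals, a single gross outlier barely perturbs the fit, which is the robustness property the abstract refers to.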


Related articles

An L1-norm method for generating all of efficient solutions of multi-objective integer linear programming problem

This paper extends the method proposed by Jahanshahloo et al. (2004) for generating all the efficient solutions of a 0–1 multi-objective linear programming problem (Asia-Pacific Journal of Operational Research). It considers the recession direction for a multi-objective integer linear programming (MOILP) problem and presents necessary and sufficient conditions to have unbounde...


Large-scale Inversion of Magnetic Data Using Golub-Kahan Bidiagonalization with Truncated Generalized Cross Validation for Regularization Parameter Estimation

In this paper a fast method for large-scale sparse inversion of magnetic data is considered. The L1-norm stabilizer is used to generate models with sharp and distinct interfaces. To deal with the non-linearity introduced by the L1-norm, a model-space iteratively reweighted least squares algorithm is used. The original model matrix is factorized using the Golub-Kahan bidiagonalization that proje...


Adaptive Estimation of Sparse Signals: where RLS meets the l1-norm

Using the l1-norm to regularize the least-squares criterion, the batch least-absolute shrinkage and selection operator (Lasso) has well-documented merits for estimating sparse signals of interest emerging in various applications where observations adhere to parsimonious linear regression models. To cope with high complexity, increasing memory requirements, and lack of tracking capability that b...
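The batch Lasso this abstract builds on can be sketched with a simple proximal-gradient (ISTA) solver — a generic illustration of the l1-regularized least-squares criterion, not the adaptive RLS-based estimator the paper itself develops; `lasso_ista`, the step rule, and the iteration count are assumptions for the sketch:

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t*||.||_1: shrink each entry toward zero."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(A, y, lam, n_iter=500):
    """Minimize 0.5*||y - A x||^2 + lam*||x||_1 by iterative
    soft-thresholding (ISTA): gradient step on the quadratic part,
    then the soft-threshold proximal step for the l1 penalty."""
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, lam / L)
    return x
```

The soft-threshold step is what produces exact zeros in the estimate, giving the sparsity that plain least squares cannot.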


Application of Recursive Least Squares to Efficient Blunder Detection in Linear Models

In many geodetic applications a large number of observations are measured to estimate the unknown parameters. The unbiasedness of the estimated parameters is ensured only if there is no bias (e.g. a systematic effect) or falsifying observations, also known as outliers. One of the most important steps towards obtaining a coherent analysis for the parameter estimation is th...
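The recursive least-squares machinery this abstract applies can be summarized in a single rank-one update: each new observation is folded into the estimate without re-solving the full normal equations. A minimal sketch of the standard RLS recursion follows (generic textbook form, not this paper's specific blunder-detection procedure; `rls_update` and its argument names are illustrative):

```python
import numpy as np

def rls_update(x, P, a, y_new):
    """One recursive least-squares step for a new scalar observation
    y_new = a^T x + noise. P is the current inverse normal matrix
    (A^T A)^{-1}; the update uses the Sherman-Morrison identity."""
    Pa = P @ a
    k = Pa / (1.0 + a @ Pa)        # gain vector
    x = x + k * (y_new - a @ x)    # correct estimate by the innovation
    P = P - np.outer(k, Pa)        # rank-one downdate of (A^T A)^{-1}
    return x, P
```

Starting from a batch solution on an initial invertible subset and applying this update for each remaining observation reproduces the full batch least-squares solution, which is why the innovation `y_new - a @ x` is a natural test statistic for screening each incoming observation.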


Online adaptive estimation of sparse signals: where RLS meets the l1-norm

Using the l1-norm to regularize the least-squares criterion, the batch least-absolute shrinkage and selection operator (Lasso) has well-documented merits for estimating sparse signals of interest emerging in various applications where observations adhere to parsimonious linear regression models. To cope with high complexity, increasing memory requirements, and lack of tracking capability that bat...



Journal:

Volume   Issue 

Pages  -

Publication date: 2010